Newton Methods for Nonsmooth Convex Minimization: Connections among U-Lagrangian, Riemannian Newton and SQP Methods

Authors

  • Jérôme Malick
  • Scott A. Miller
Abstract

This paper studies Newton-type methods for minimization of partly smooth convex functions. Sequential Newton methods are provided using local parameterizations obtained from U-Lagrangian theory and from Riemannian geometry. The Hessian based on the U-Lagrangian depends on the selection of a dual parameter g; by revealing the connection to Riemannian geometry, a natural choice of g emerges for which the two Newton directions become identical. This choice of g is also shown to be related to the least-squares multiplier estimate from a sequential quadratic programming (SQP) approach, and with this multiplier, SQP gives the same search direction as the Newton methods.
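For context, the U-Lagrangian construction underlying the abstract (due to Lemaréchal, Oustry and Sagastizábal) can be sketched as follows; the notation below is a standard rendering and an assumption on our part, not quoted from this paper.

```latex
% Sketch of the U-Lagrangian (after Lemaréchal–Oustry–Sagastizábal);
% the notation is our assumption, not quoted from the paper.
% Fix \bar{x} and a subgradient g \in \partial f(\bar{x}), and split the space:
%   V = \operatorname{lin}\bigl(\partial f(\bar{x}) - g\bigr), \qquad U = V^{\perp}.
% The U-Lagrangian parameterized by the dual variable g is
\[
  L_U(u; g) = \inf_{v \in V} \bigl\{ f(\bar{x} + u + v) - \langle g, v \rangle \bigr\},
\]
% which is differentiable in u near 0 for partly smooth convex f; its second
% derivative at u = 0 is the g-dependent Hessian mentioned in the abstract.
```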


Similar articles


Forward-backward Truncated Newton Methods for Convex Composite Optimization

This paper proposes two proximal Newton-CG methods for convex nonsmooth optimization problems in composite form. The algorithms are based on a reformulation of the original nonsmooth problem as the unconstrained minimization of a continuously differentiable function, namely the forward-backward envelope (FBE). The first algorithm is based on a standard line search strategy, whereas the second...
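For reference, the forward-backward envelope for a composite objective F = f + g (f smooth with Lipschitz gradient, g convex and nonsmooth) is usually defined as below; this is our paraphrase of the standard definition, not a quotation from the paper.

```latex
% Forward-backward envelope (standard form; our paraphrase, not quoted).
% For F = f + g with \nabla f L-Lipschitz and a step \gamma \in (0, 1/L):
\[
  F_\gamma^{\mathrm{FB}}(x) = f(x) - \frac{\gamma}{2}\,\lVert \nabla f(x) \rVert^2
    + g^\gamma\!\bigl(x - \gamma \nabla f(x)\bigr),
\]
% where g^\gamma(z) = \min_w \{ g(w) + \tfrac{1}{2\gamma}\lVert w - z \rVert^2 \}
% is the Moreau envelope of g. The FBE is real-valued, continuously
% differentiable when f is twice differentiable, and shares its minimizers
% with F, which is what makes an unconstrained Newton-CG method applicable.
```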


On the Performance of SQP Methods for Nonlinear Optimization

This paper concerns some practical issues associated with the formulation of sequential quadratic programming (SQP) methods for large-scale nonlinear optimization. SQP methods find an approximate solution of a sequence of quadratic programming (QP) subproblems in which a quadratic model of the objective function is minimized subject to the linearized constraints. Extensive numerical results are...
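The QP subproblem referred to above has the familiar textbook shape; the formulation below is generic, not copied from the paper.

```latex
% Generic SQP subproblem at iterate x_k (textbook form; an assumption here).
% H_k approximates the Hessian of the Lagrangian:
\[
  \min_{d} \;\; \nabla f(x_k)^{\!\top} d + \tfrac{1}{2}\, d^{\top} H_k\, d
  \quad \text{s.t.} \quad c(x_k) + J(x_k)\, d \ge 0,
\]
% where J(x_k) is the Jacobian of the constraints c at x_k. The solution d_k
% is the search direction; its QP multipliers estimate the NLP multipliers.
```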


A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles

We propose an algorithm for solving nonsmooth, nonconvex, constrained optimization problems as well as a new set of visualization tools for comparing the performance of optimization algorithms. Our algorithm is a sequential quadratic optimization method that employs BFGS quasi-Newton Hessian approximations and an exact penalty function whose parameter is controlled using a steering strategy. We...
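A common form of an exact penalty function with a steering-controlled parameter is sketched below; the precise function used in the paper may differ, so treat this as an assumption.

```latex
% A typical exact (l1-type) penalty for constraints c_i(x) <= 0; an
% assumption on our part, the paper's exact choice may differ:
\[
  \phi(x; \mu) = \mu\, f(x) + \sum_i \max\bigl\{ c_i(x),\, 0 \bigr\}.
\]
% Steering strategies reduce the penalty parameter \mu when the QP
% subproblem signals insufficient progress toward feasibility, so that
% constraint violation is driven down before the objective dominates.
```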


BFGS convergence to nonsmooth minimizers of convex functions

Under reasonable conditions, the popular BFGS quasi-Newton minimization algorithm converges globally on smooth convex functions. This result was proved by Powell in 1976; we consider its implications for functions that are not smooth. In particular, an analogous convergence result holds for functions, like the Euclidean norm, that are nonsmooth at the minimizer.
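As a quick illustration of that behavior, one can run a standard library BFGS implementation on the Euclidean norm and watch the iterates approach the nonsmooth minimizer at the origin; this snippet is our own illustrative experiment, not code from the paper.

```python
# Illustrative only: run library BFGS on the Euclidean norm (nonsmooth at 0)
# and observe convergence toward the minimizer at the origin.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return np.linalg.norm(x)

def grad(x):
    # Gradient of the Euclidean norm, valid away from the origin.
    return x / np.linalg.norm(x)

x0 = np.array([1.0, -2.0, 0.5])
res = minimize(f, x0, jac=grad, method="BFGS")

# The line search eventually stalls near the kink, but the final iterate is
# very close to 0, consistent with the convergence result described above.
print(res.x, f(res.x))
```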



Journal:

Volume:   Issue:

Pages:   -

Publication date: 2004